Learning Summary Statistic for Approximate Bayesian Computation via Deep Neural Network
Authors
Abstract
Approximate Bayesian Computation (ABC) methods are used to approximate posterior distributions in models with unknown or computationally intractable likelihoods. Both the accuracy and computational efficiency of ABC depend on the choice of summary statistic, but outside of special cases where the optimal summary statistics are known, it is unclear which guiding principles can be used to construct effective summary statistics. In this paper we explore the possibility of automating the process of constructing summary statistics by training deep neural networks to predict the parameters from artificially generated data: the resulting summary statistics are approximately posterior means of the parameters. With minimal model-specific tuning, our method constructs summary statistics for the Ising model and the moving-average model which match or exceed theoretically motivated summary statistics in terms of the accuracy of the resulting posteriors.
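The pipeline the abstract describes can be sketched in a few steps: simulate (parameter, data) pairs from the prior and the model, train a network to regress the parameters on the raw data, and then run rejection ABC using the network's output as the summary statistic. The following is a minimal illustration only, not the paper's architecture: it assumes scikit-learn's MLPRegressor as the network, an MA(2) moving-average model, a uniform prior on (-1, 1)^2, and a quantile-based acceptance threshold; all of these choices are placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def simulate_ma2(theta, n=100):
    # MA(2) process: x_t = e_t + theta1 * e_{t-1} + theta2 * e_{t-2}
    e = rng.normal(size=n + 2)
    return e[2:] + theta[0] * e[1:-1] + theta[1] * e[:-2]

# 1) Training set: draw parameters from the (assumed uniform) prior
#    and simulate a data set for each draw.
n_train = 2000
train_thetas = rng.uniform(-1, 1, size=(n_train, 2))
train_X = np.array([simulate_ma2(t) for t in train_thetas])

# 2) Train a network to predict theta from the raw series; its output
#    approximates E[theta | x] and serves as the summary statistic.
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
net.fit(train_X, train_thetas)

# 3) Rejection ABC with the learned summary: accept the prior draws
#    whose summaries fall closest to the observed summary.
theta_true = np.array([0.6, 0.2])
s_obs = net.predict(simulate_ma2(theta_true).reshape(1, -1))[0]

prior_draws = rng.uniform(-1, 1, size=(5000, 2))
summaries = net.predict(np.array([simulate_ma2(t) for t in prior_draws]))
dist = np.linalg.norm(summaries - s_obs, axis=1)
eps = np.quantile(dist, 0.01)           # keep the closest 1% of draws
posterior = prior_draws[dist <= eps]

print(posterior.mean(axis=0))  # approximate posterior mean of (theta1, theta2)
```

Because the network output is low-dimensional (one coordinate per parameter), the acceptance step compares summaries rather than raw 100-dimensional series, which is what makes the rejection step tractable.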
Similar Papers
Deep Generative Vision as Approximate Bayesian Computation
Probabilistic formulations of inverse graphics have recently been proposed for a variety of 2D and 3D vision problems [15, 12, 14, 9]. These approaches represent visual elements in form of graphics simulators that produce approximate renderings of the visual scenes. Existing approaches either model pixel data or hand-crafted intermediate representations such as edge maps, super-pixels, silhouet...
Learning Summary Statistics for Approximate Bayesian Computation
In high dimensional data, it is often very difficult to analytically evaluate the likelihood function, and thus hard to get a Bayesian posterior estimation. Approximate Bayesian Computation is an important algorithm in this application. However, to apply the algorithm, we need to compress the data into low dimensional summary statistics, which is typically hard to get in an analytical form. In ...
Learning from LDA Using Deep Neural Networks
Latent Dirichlet Allocation (LDA) is a three-level hierarchical Bayesian model for topic inference. In spite of its great success, inferring the latent topic distribution with LDA is time-consuming. Motivated by the transfer learning approach proposed by Hinton et al. (2015), we present a novel method that uses LDA to supervise the training of a deep neural network (DNN), so that the DNN can ap...
K2-ABC: Approximate Bayesian Computation with Infinite Dimensional Summary Statistics via Kernel Embeddings
Complicated generative models often result in a situation where computing the likelihood of observed data is intractable, while simulating from the conditional density given a parameter value is relatively easy. Approximate Bayesian Computation (ABC) is a paradigm that enables simulation-based posterior inference in such cases by measuring the similarity between simulated and observed data in t...
Bayesian Hypernetworks
We propose Bayesian hypernetworks: a framework for approximate Bayesian inference in neural networks. A Bayesian hypernetwork, h, is a neural network which learns to transform a simple noise distribution, p( ) = N (0, I), to a distribution q(θ) . = q(h( )) over the parameters θ of another neural network (the "primary network"). We train q with variational inference, using an invertible h to ena...